Regulation (EU) 2022/2065 (Digital Services Act): provisions retrieved for the search term ‘terms and conditions’

Article 3

Definitions

For the purpose of this Regulation, the following definitions shall apply:

(a)

‘information society service’ means a ‘service’ as defined in Article 1(1), point (b), of Directive (EU) 2015/1535;

(b)

‘recipient of the service’ means any natural or legal person who uses an intermediary service, in particular for the purposes of seeking information or making it accessible;

(c)

‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft, or profession;

(d)

‘to offer services in the Union’ means enabling natural or legal persons in one or more Member States to use the services of a provider of intermediary services that has a substantial connection to the Union;

(e)

‘substantial connection to the Union’ means a connection of a provider of intermediary services with the Union resulting either from its establishment in the Union or from specific factual criteria, such as:

a significant number of recipients of the service in one or more Member States in relation to its or their population; or

the targeting of activities towards one or more Member States;

(f)

‘trader’ means any natural person, or any legal person irrespective of whether it is privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession;

(g)

‘intermediary service’ means one of the following information society services:

(i)

a ‘mere conduit’ service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;

(ii)

a ‘caching’ service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;

(iii)

a ‘hosting’ service, consisting of the storage of information provided by, and at the request of, a recipient of the service;

(h)

‘illegal content’ means any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law;

(i)

‘online platform’ means a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation;

(j)

‘online search engine’ means an intermediary service that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found;

(k)

‘dissemination to the public’ means making information available, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties;

(l)

‘distance contract’ means ‘distance contract’ as defined in Article 2, point (7), of Directive 2011/83/EU;

(m)

‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications;

(n)

‘Digital Services Coordinator of establishment’ means the Digital Services Coordinator of the Member State where the main establishment of a provider of an intermediary service is located or its legal representative resides or is established;

(o)

‘Digital Services Coordinator of destination’ means the Digital Services Coordinator of a Member State where the intermediary service is provided;

(p)

‘active recipient of an online platform’ means a recipient of the service that has engaged with an online platform by either requesting the online platform to host information or being exposed to information hosted by the online platform and disseminated through its online interface;

(q)

‘active recipient of an online search engine’ means a recipient of the service that has submitted a query to an online search engine and been exposed to information indexed and presented on its online interface;

(r)

‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and presented by an online platform on its online interface against remuneration specifically for promoting that information;

(s)

‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient of the service or otherwise determining the relative order or prominence of information displayed;

(t)

‘content moderation’ means the activities, whether automated or not, undertaken by providers of intermediary services, that are aimed, in particular, at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, and accessibility of that illegal content or that information, such as demotion, demonetisation, disabling of access to, or removal thereof, or that affect the ability of the recipients of the service to provide that information, such as the termination or suspension of a recipient’s account;

(u)

‘terms and conditions’ means all clauses, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the service;

(v)

‘persons with disabilities’ means ‘persons with disabilities’ as referred to in Article 3, point (1), of Directive (EU) 2019/882 of the European Parliament and of the Council;

(w)

‘commercial communication’ means ‘commercial communication’ as defined in Article 2, point (f), of Directive 2000/31/EC;

(x)

‘turnover’ means the amount derived by an undertaking within the meaning of Article 5(1) of Council Regulation (EC) No 139/2004.

CHAPTER II

LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES


Article 14

Terms and conditions

1.   Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. It shall be set out in clear, plain, intelligible, user-friendly and unambiguous language, and shall be publicly available in an easily accessible and machine-readable format.

2.   Providers of intermediary services shall inform the recipients of the service of any significant change to the terms and conditions.

3.   Where an intermediary service is primarily directed at minors or is predominantly used by them, the provider of that intermediary service shall explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand.

4.   Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.

5.   Providers of very large online platforms and of very large online search engines shall provide recipients of services with a concise, easily accessible and machine-readable summary of the terms and conditions, including the available remedies and redress mechanisms, in clear and unambiguous language.

6.   Very large online platforms and very large online search engines within the meaning of Article 33 shall publish their terms and conditions in the official languages of all the Member States in which they offer their services.
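Article 14(1) and (5) require the terms and conditions, and for very large platforms a summary of them, to be publicly available in a machine-readable format, but they prescribe no schema. The following Python sketch shows one hypothetical shape such a publication could take; every field name, value and URL is invented for illustration only.

```python
import json

# Hypothetical machine-readable publication of terms-and-conditions
# information under Article 14(1) and (5). All field names are
# illustrative; the Regulation prescribes no schema.
terms_publication = {
    "service": "example-platform",           # assumed service identifier
    "languages": ["en", "fr", "de"],         # Art. 14(6): official languages served
    "content_moderation": {                  # Art. 14(1): policies, procedures, tools
        "algorithmic_decision_making": True,
        "human_review": True,
        "complaint_handling_rules_url": "https://example.org/complaints",
    },
    "summary": {                             # Art. 14(5): summary for VLOPs/VLOSEs
        "restrictions": ["no illegal content", "no spam"],
        "remedies": ["internal complaint", "out-of-court settlement", "court"],
    },
}

# Serialise for publication; json.loads(machine_readable) round-trips.
machine_readable = json.dumps(terms_publication, indent=2)
```

The choice of JSON is itself an assumption: any format that software can parse without human intervention would satisfy "machine-readable".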

Article 15

Transparency reporting obligations for providers of intermediary services

1.   Providers of intermediary services shall make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:

(a)

for providers of intermediary services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;

(b)

for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;

(c)

for providers of intermediary services, meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal content or violation of the terms and conditions of the service provider, by the detection method and by the type of restriction applied;

(d)

for providers of intermediary services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms and conditions and additionally, for providers of online platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

(e)

any use made of automated means for the purpose of content moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.

2.   Paragraph 1 of this Article shall not apply to providers of intermediary services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC and which are not very large online platforms within the meaning of Article 33 of this Regulation.

3.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article, including harmonised reporting periods. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.
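Points (a) to (e) of paragraph 1 effectively define a report data structure: counts categorised along several axes, plus median handling times. Pending any Commission template under paragraph 3, the fragment below is a hypothetical sketch covering points (a) and (b) only; all figures and field names are invented.

```python
from statistics import median

# Hypothetical transparency-report fragment for Article 15(1),
# points (a) and (b). Figures are invented; the Commission may later
# fix templates by implementing act (Art. 15(3)).
days_to_give_effect = [2, 5, 3, 8, 4]        # per-order handling times, in days

report = {
    "orders_from_authorities": {             # point (a)
        "total": len(days_to_give_effect),
        "by_member_state": {"DE": 3, "FR": 2},
        "median_days_to_give_effect": median(days_to_give_effect),
    },
    "notices": {                             # point (b)
        "total": 120,
        "from_trusted_flaggers": 15,
        "processed_by_automated_means": 90,
    },
}

print(report["orders_from_authorities"]["median_days_to_give_effect"])  # 4
```

Note that the Article asks for the median, not the mean, of handling times, which `statistics.median` computes directly.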

SECTION 2

Additional provisions applicable to providers of hosting services, including online platforms

Article 17

Statement of reasons

1.   Providers of hosting services shall provide a clear and specific statement of reasons to any affected recipients of the service for any of the following restrictions imposed on the ground that the information provided by the recipient of the service is illegal content or incompatible with their terms and conditions:

(a)

any restrictions of the visibility of specific items of information provided by the recipient of the service, including removal of content, disabling access to content, or demoting content;

(b)

suspension, termination or other restriction of monetary payments;

(c)

suspension or termination of the provision of the service in whole or in part;

(d)

suspension or termination of the recipient of the service’s account.

2.   Paragraph 1 shall only apply where the relevant electronic contact details are known to the provider. It shall apply at the latest from the date that the restriction is imposed, regardless of why or how it was imposed.

Paragraph 1 shall not apply where the information is deceptive high-volume commercial content.

3.   The statement of reasons referred to in paragraph 1 shall at least contain the following information:

(a)

information on whether the decision entails either the removal of, the disabling of access to, the demotion of or the restriction of the visibility of the information, or the suspension or termination of monetary payments related to that information, or imposes other measures referred to in paragraph 1 with regard to the information, and, where relevant, the territorial scope of the decision and its duration;

(b)

the facts and circumstances relied on in taking the decision, including, where relevant, information on whether the decision was taken pursuant to a notice submitted in accordance with Article 16 or based on voluntary own-initiative investigations and, where strictly necessary, the identity of the notifier;

(c)

where applicable, information on the use made of automated means in taking the decision, including information on whether the decision was taken in respect of content detected or identified using automated means;

(d)

where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground;

(e)

where the decision is based on the alleged incompatibility of the information with the terms and conditions of the provider of hosting services, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground;

(f)

clear and user-friendly information on the possibilities for redress available to the recipient of the service in respect of the decision, in particular, where applicable through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.

4.   The information provided by the providers of hosting services in accordance with this Article shall be clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information shall, in particular, be such as to reasonably allow the recipient of the service concerned to effectively exercise the possibilities for redress referred to in paragraph 3, point (f).

5.   This Article shall not apply to any orders referred to in Article 9.
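Article 17 encodes two conditional rules: paragraph 2 gates whether a statement of reasons is owed at all, and paragraph 3, points (d) and (e), make the required ground depend on the basis of the decision. A minimal sketch of that logic, with function and key names invented for illustration:

```python
# Sketch of the conditional structure in Article 17. A statement of
# reasons is owed only when contact details are known and the content
# is not deceptive high-volume commercial content (para. 2); it must
# cite a legal ground for illegality decisions (para. 3(d)) or a
# contractual ground for terms-and-conditions decisions (para. 3(e)).
def statement_required(contact_known: bool, deceptive_high_volume: bool) -> bool:
    return contact_known and not deceptive_high_volume

def required_ground(decision_basis: str) -> str:
    if decision_basis == "illegal_content":
        return "legal ground"          # Art. 17(3)(d)
    if decision_basis == "terms_and_conditions":
        return "contractual ground"    # Art. 17(3)(e)
    raise ValueError("decision must rest on illegality or the terms and conditions")

print(statement_required(True, False))          # True
print(required_ground("terms_and_conditions"))  # contractual ground
```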

Article 20

Internal complaint-handling system

1.   Providers of online platforms shall provide recipients of the service, including individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system that enables them to lodge complaints, electronically and free of charge, against the decision taken by the provider of the online platform upon the receipt of a notice or against the following decisions taken by the provider of the online platform on the grounds that the information provided by the recipients constitutes illegal content or is incompatible with its terms and conditions:

(a)

decisions whether or not to remove or disable access to or restrict visibility of the information;

(b)

decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;

(c)

decisions whether or not to suspend or terminate the recipients’ account;

(d)

decisions whether or not to suspend, terminate or otherwise restrict the ability to monetise information provided by the recipients.

2.   The period of at least six months referred to in paragraph 1 of this Article shall start on the day on which the recipient of the service is informed about the decision in accordance with Article 16(5) or Article 17.

3.   Providers of online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.

4.   Providers of online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, non-discriminatory, diligent and non-arbitrary manner. Where a complaint contains sufficient grounds for the provider of the online platform to consider that its decision not to act upon the notice is unfounded or that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the measure taken, it shall reverse its decision referred to in paragraph 1 without undue delay.

5.   Providers of online platforms shall inform complainants without undue delay of their reasoned decision in respect of the information to which the complaint relates and of the possibility of out-of-court dispute settlement provided for in Article 21 and other available possibilities for redress.

6.   Providers of online platforms shall ensure that the decisions, referred to in paragraph 5, are taken under the supervision of appropriately qualified staff, and not solely on the basis of automated means.
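Paragraph 2 fixes the start of the at-least-six-month complaint window at the day the recipient is informed of the decision. The sketch below is one assumed way of computing that window; the Regulation does not define how calendar months are counted, so the month arithmetic (including the end-of-month clamp) is an assumption.

```python
from datetime import date

# Sketch of the Article 20(2) complaint window: access to the internal
# complaint-handling system runs for at least six months from the day
# the recipient is informed of the decision (Art. 16(5) / Art. 17).
def complaint_window_end(informed_on: date, months: int = 6) -> date:
    month_index = informed_on.month - 1 + months
    year = informed_on.year + month_index // 12
    month = month_index % 12 + 1
    # clamp the day for shorter target months (e.g. 31 Aug -> 28/29 Feb)
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days_in_month = [31, 29 if leap else 28, 31, 30, 31, 30,
                     31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(informed_on.day, days_in_month))

def may_still_complain(informed_on: date, today: date) -> bool:
    return today <= complaint_window_end(informed_on)

print(complaint_window_end(date(2024, 1, 15)))                  # 2024-07-15
print(may_still_complain(date(2024, 1, 15), date(2024, 8, 1)))  # False
```

Since six months is a floor ("at least"), a provider could lawfully keep the system open longer; the function models only the minimum.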

Article 21

Out-of-court dispute settlement

1.   Recipients of the service, including individuals or entities that have submitted notices, addressed by the decisions referred to in Article 20(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 3 of this Article in order to resolve disputes relating to those decisions, including complaints that have not been resolved by means of the internal complaint-handling system referred to in that Article.

Providers of online platforms shall ensure that information about the possibility for recipients of the service to have access to an out-of-court dispute settlement, as referred to in the first subparagraph, is easily accessible on their online interface, clear and user-friendly.

The first subparagraph is without prejudice to the right of the recipient of the service concerned to initiate, at any stage, proceedings to contest those decisions by the providers of online platforms before a court in accordance with the applicable law.

2.   Both parties shall engage, in good faith, with the selected certified out-of-court dispute settlement body with a view to resolving the dispute.

Providers of online platforms may refuse to engage with such out-of-court dispute settlement body if a dispute has already been resolved concerning the same information and the same grounds of alleged illegality or incompatibility of content.

The certified out-of-court dispute settlement body shall not have the power to impose a binding settlement of the dispute on the parties.

3.   The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, for a maximum period of five years, which may be renewed, certify the body, at its request, where the body has demonstrated that it meets all of the following conditions:

(a)

it is impartial and independent, including financially independent, of providers of online platforms and of recipients of the service provided by providers of online platforms, including of individuals or entities that have submitted notices;

(b)

it has the necessary expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platform, allowing the body to contribute effectively to the settlement of a dispute;

(c)

its members are remunerated in a way that is not linked to the outcome of the procedure;

(d)

the out-of-court dispute settlement that it offers is easily accessible, through electronic communications technology and provides for the possibility to initiate the dispute settlement and to submit the requisite supporting documents online;

(e)

it is capable of settling disputes in a swift, efficient and cost-effective manner and in at least one of the official languages of the institutions of the Union;

(f)

the out-of-court dispute settlement that it offers takes place in accordance with clear and fair rules of procedure that are easily and publicly accessible, and that comply with applicable law, including this Article.

The Digital Services Coordinator shall, where applicable, specify in the certificate:

(a)

the particular issues to which the body’s expertise relates, as referred to in point (b) of the first subparagraph; and

(b)

the official language or languages of the institutions of the Union in which the body is capable of settling disputes, as referred to in point (e) of the first subparagraph.

4.   Certified out-of-court dispute settlement bodies shall report to the Digital Services Coordinator that certified them, on an annual basis, on their functioning, specifying at least the number of disputes they received, the information about the outcomes of those disputes, the average time taken to resolve them and any shortcomings or difficulties encountered. They shall provide additional information at the request of that Digital Services Coordinator.

Digital Services Coordinators shall, every two years, draw up a report on the functioning of the out-of-court dispute settlement bodies that they certified. That report shall in particular:

(a)

list the number of disputes that each certified out-of-court dispute settlement body has received annually;

(b)

indicate the outcomes of the procedures brought before those bodies and the average time taken to resolve the disputes;

(c)

identify and explain any systematic or sectoral shortcomings or difficulties encountered in relation to the functioning of those bodies;

(d)

identify best practices concerning that functioning;

(e)

make recommendations as to how to improve that functioning, where appropriate.

Certified out-of-court dispute settlement bodies shall make their decisions available to the parties within a reasonable period of time and no later than 90 calendar days after the receipt of the complaint. In the case of highly complex disputes, the certified out-of-court dispute settlement body may, at its own discretion, extend the 90 calendar day period for an additional period that shall not exceed 90 days, resulting in a maximum total duration of 180 days.
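The deadline rule in the paragraph above (90 calendar days from receipt of the complaint, extensible once by at most 90 further days for highly complex disputes, 180 days in total) can be sketched directly; the function name and the way the extension is passed are illustrative.

```python
from datetime import date, timedelta

# Sketch of the Article 21(4) decision deadline: 90 calendar days,
# extensible at the body's discretion by at most a further 90 days.
def decision_deadline(received: date, extension_days: int = 0) -> date:
    if not 0 <= extension_days <= 90:
        raise ValueError("extension may not exceed 90 days")
    return received + timedelta(days=90 + extension_days)

print(decision_deadline(date(2024, 1, 1)))                     # 2024-03-31
print(decision_deadline(date(2024, 1, 1), extension_days=90))  # 2024-06-29
```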

5.   If the out-of-court dispute settlement body decides the dispute in favour of the recipient of the service, including the individual or entity that has submitted a notice, the provider of the online platform shall bear all the fees charged by the out-of-court dispute settlement body, and shall reimburse that recipient, including the individual or entity, for any other reasonable expenses that it has paid in relation to the dispute settlement. If the out-of-court dispute settlement body decides the dispute in favour of the provider of the online platform, the recipient of the service, including the individual or entity, shall not be required to reimburse any fees or other expenses that the provider of the online platform paid or is to pay in relation to the dispute settlement, unless the out-of-court dispute settlement body finds that that recipient manifestly acted in bad faith.

The fees charged by the out-of-court dispute settlement body to the providers of online platforms for the dispute settlement shall be reasonable and shall in any event not exceed the costs incurred by the body. For recipients of the service, the dispute settlement shall be available free of charge or at a nominal fee.

Certified out-of-court dispute settlement bodies shall make the fees, or the mechanisms used to determine the fees, known to the recipient of the service, including to the individuals or entities that have submitted a notice, and to the provider of the online platform concerned, before engaging in the dispute settlement.

6.   Member States may establish out-of-court dispute settlement bodies for the purposes of paragraph 1 or support the activities of some or all out-of-court dispute settlement bodies that they have certified in accordance with paragraph 3.

Member States shall ensure that any of their activities undertaken under the first subparagraph do not affect the ability of their Digital Services Coordinators to certify the bodies concerned in accordance with paragraph 3.

7.   A Digital Services Coordinator that has certified an out-of-court dispute settlement body shall revoke that certification if it determines, following an investigation either on its own initiative or on the basis of the information received by third parties, that the out-of-court dispute settlement body no longer meets the conditions set out in paragraph 3. Before revoking that certification, the Digital Services Coordinator shall afford that body an opportunity to react to the findings of its investigation and its intention to revoke the out-of-court dispute settlement body’s certification.

8.   Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 3, including where applicable the specifications referred to in the second subparagraph of that paragraph, as well as the out-of-court dispute settlement bodies the certification of which they have revoked. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website that is easily accessible, and keep it up to date.

9.   This Article is without prejudice to Directive 2013/11/EU and alternative dispute resolution procedures and entities for consumers established under that Directive.

Article 23

Measures and protection against misuse

1.   Providers of online_platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal_content.

2.   Providers of online_platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 16 and 20, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.

3.   When deciding on suspension, providers of online_platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether the recipient_of_the_service, the individual, the entity or the complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of online_platforms. Those circumstances shall include at least the following:

(a)

the absolute numbers of items of manifestly illegal_content or manifestly unfounded notices or complaints, submitted within a given time frame;

(b)

the relative proportion thereof in relation to the total number of items of information provided or notices submitted within a given time frame;

(c)

the gravity of the misuses, including the nature of illegal_content, and of its consequences;

(d)

where it is possible to identify it, the intention of the recipient_of_the_service, the individual, the entity or the complainant.

4.   Providers of online_platforms shall set out, in a clear and detailed manner, in their terms and conditions their policy in respect of the misuse referred to in paragraphs 1 and 2, and shall give examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
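The case-by-case assessment in paragraph 3 combines at least four circumstances: absolute numbers (a), relative proportion (b), gravity (c), and, where identifiable, intent (d). A minimal sketch of such an assessment is given below; the thresholds, weights, and all names are illustrative assumptions, since the Regulation deliberately leaves the concrete policy to each provider's terms and conditions.

```python
from dataclasses import dataclass

@dataclass
class NoticeHistory:
    manifestly_unfounded: int  # (a) absolute number within the time frame
    total_submitted: int       # basis for (b) the relative proportion
    max_severity: int          # (c) gravity of the misuse, e.g. on a 0-3 scale
    bad_faith_indicated: bool  # (d) intention, where it is possible to identify it

def warrants_suspension(h: NoticeHistory,
                        min_count: int = 10,
                        min_ratio: float = 0.8) -> bool:
    """Assumed heuristic: suspend only on a high absolute count AND a high
    proportion of manifestly unfounded submissions, with gravity or apparent
    intent lowering the bar. Thresholds are hypothetical."""
    if h.total_submitted == 0:
        return False
    ratio = h.manifestly_unfounded / h.total_submitted
    flagged = h.manifestly_unfounded >= min_count and ratio >= min_ratio
    # Gravity (c) or apparent bad faith (d) tightens the assessment:
    if h.bad_faith_indicated or h.max_severity >= 3:
        flagged = (h.manifestly_unfounded >= min_count // 2
                   and ratio >= min_ratio / 2)
    return flagged
```

Whatever concrete policy a provider adopts, paragraph 4 requires it to be set out in the terms and conditions with examples and the duration of the suspension.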

Article 27

Recommender system transparency

1.   Providers of online_platforms that use recommender_systems shall set out in their terms and conditions, in plain and intelligible language, the main parameters used in their recommender_systems, as well as any options for the recipients of the service to modify or influence those main parameters.

2.   The main parameters referred to in paragraph 1 shall explain why certain information is suggested to the recipient_of_the_service. They shall include, at least:

(a)

the criteria which are most significant in determining the information suggested to the recipient_of_the_service;

(b)

the reasons for the relative importance of those parameters.

3.   Where several options are available pursuant to paragraph 1 for recommender_systems that determine the relative order of information presented to recipients of the service, providers of online_platforms shall also make available a functionality that allows the recipient_of_the_service to select and to modify at any time their preferred option. That functionality shall be directly and easily accessible from the specific section of the online_platform’s online_interface where the information is being prioritised.
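One way to read Article 27 as a data contract: each option of a recommender system exposes its main parameters, each with the criterion (paragraph 2(a)) and the reason for its relative importance (paragraph 2(b)), and the recipient can switch options at any time (paragraph 3). The sketch below is a hypothetical illustration; none of these field or option names are prescribed by the Regulation.

```python
from dataclasses import dataclass

@dataclass
class MainParameter:
    name: str    # criterion most significant for determining suggestions (2(a))
    reason: str  # why it carries this relative importance (2(b))

@dataclass
class RecommenderOptions:
    options: dict[str, list[MainParameter]]
    selected: str = "engagement"

    def select(self, option: str) -> None:
        """Paragraph 3: the recipient may select and modify the
        preferred option at any time."""
        if option not in self.options:
            raise ValueError(f"unknown option: {option}")
        self.selected = option

# Hypothetical disclosure with two user-selectable options:
opts = RecommenderOptions(options={
    "engagement": [MainParameter("predicted interest",
                                 "past interactions weigh most heavily")],
    "chronological": [MainParameter("recency",
                                    "newest items are shown first")],
})
opts.select("chronological")
print(opts.selected)  # → chronological
```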

Article 34

Risk assessment

1.   Providers of very large online_platforms and of very large online_search_engines shall diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.

They shall carry out the risk assessments by the date of application referred to in Article 33(6), second subparagraph, and at least once every year thereafter, and in any event prior to deploying functionalities that are likely to have a critical impact on the risks identified pursuant to this Article. This risk assessment shall be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, and shall include the following systemic risks:

(a)

the dissemination of illegal_content through their services;

(b)

any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity enshrined in Article 1 of the Charter, to respect for private and family life enshrined in Article 7 of the Charter, to the protection of personal data enshrined in Article 8 of the Charter, to freedom of expression and information, including the freedom and pluralism of the media, enshrined in Article 11 of the Charter, to non-discrimination enshrined in Article 21 of the Charter, to respect for the rights of the child enshrined in Article 24 of the Charter and to a high level of consumer protection enshrined in Article 38 of the Charter;

(c)

any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;

(d)

any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.

2.   When conducting risk assessments, providers of very large online_platforms and of very large online_search_engines shall take into account, in particular, whether and how the following factors influence any of the systemic risks referred to in paragraph 1:

(a)

the design of their recommender_systems and any other relevant algorithmic system;

(b)

their content_moderation systems;

(c)

the applicable terms and conditions and their enforcement;

(d)

systems for selecting and presenting advertisements;

(e)

data related practices of the provider.

The assessments shall also analyse whether and how the risks pursuant to paragraph 1 are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal_content and of information that is incompatible with their terms and conditions.

The assessment shall take into account specific regional or linguistic aspects, including when specific to a Member State.

3.   Providers of very large online_platforms and of very large online_search_engines shall preserve the supporting documents of the risk assessments for at least three years after the performance of risk assessments, and shall, upon request, communicate them to the Commission and to the Digital_Services_Coordinator_of_establishment.

Article 35

Mitigation of risks

1.   Providers of very large online_platforms and of very large online_search_engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online_interfaces;

(b)

adapting their terms and conditions and their enforcement;

(c)

adapting content_moderation processes, including the speed and quality of processing notices related to specific types of illegal_content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content_moderation;

(d)

testing and adapting their algorithmic systems, including their recommender_systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online_platforms or of online_search_engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online_interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online_interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online_platforms and of very large online_search_engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online_platforms and of very large online_search_engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 39

Additional online advertising transparency

1.   Providers of very large online_platforms or of very large online_search_engines that present advertisements on their online_interfaces shall compile and make publicly available in a specific section of their online_interface, through a searchable and reliable tool that allows multicriteria queries and through application programming interfaces, a repository containing the information referred to in paragraph 2, for the entire period during which they present an advertisement and until one year after the advertisement was presented for the last time on their online_interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been presented, and shall make reasonable efforts to ensure that the information is accurate and complete.

2.   The repository shall include at least all of the following information:

(a)

the content of the advertisement, including the name of the product, service or brand and the subject matter of the advertisement;

(b)

the natural or legal person on whose behalf the advertisement is presented;

(c)

the natural or legal person who paid for the advertisement, if that person is different from the person referred to in point (b);

(d)

the period during which the advertisement was presented;

(e)

whether the advertisement was intended to be presented specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose including where applicable the main parameters used to exclude one or more of such particular groups;

(f)

the commercial_communications published on the very large online_platforms and identified pursuant to Article 26(2);

(g)

the total number of recipients of the service reached and, where applicable, aggregate numbers broken down by Member State for the group or groups of recipients that the advertisement specifically targeted.

3.   As regards paragraph 2, points (a), (b) and (c), where a provider of very large online_platform or of very large online_search_engine has removed or disabled access to a specific advertisement based on alleged illegality or incompatibility with its terms and conditions, the repository shall not include the information referred to in those points. In such case, the repository shall include, for the specific advertisement concerned, the information referred to in Article 17(3), points (a) to (e), or Article 9(2), point (a)(i), as applicable.

The Commission may, after consultation of the Board, the relevant vetted researchers referred to in Article 40 and the public, issue guidelines on the structure, organisation and functionalities of the repositories referred to in this Article.

Article 44

Standards

1.   The Commission shall consult the Board, and shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, at least in respect of the following:

(a)

electronic submission of notices under Article 16;

(b)

templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms and conditions and changes thereto;

(c)

electronic submission of notices by trusted flaggers under Article 22, including through application programming interfaces;

(d)

specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 39 and 40;

(e)

auditing of very large online_platforms and of very large online_search_engines pursuant to Article 37;

(f)

interoperability of the advertisement repositories referred to in Article 39(2);

(g)

transmission of data between advertising intermediaries in support of transparency obligations pursuant to Article 26(1), points (b), (c) and (d);

(h)

technical measures to enable compliance with obligations relating to advertising contained in this Regulation, including the obligations regarding prominent markings for advertisements and commercial_communications referred to in Article 26;

(i)

choice interfaces and presentation of information on the main parameters of different types of recommender_systems, in accordance with Articles 27 and 38;

(j)

standards for targeted measures to protect minors online.

2.   The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. The relevant information regarding the update of the standards shall be publicly available and easily accessible.

